Mutual information is critically dependent on prior assumptions: would the correct estimate of mutual information please identify itself?

Authors

Abstract

MOTIVATION Mutual information (MI) is a quantity that measures the dependence between two arbitrary random variables and has been used repeatedly to solve a wide variety of bioinformatics problems. Recently, when attempting to quantify the effects of sampling variance on computed values of MI in proteins, we encountered striking differences among various novel estimates of MI. These differences ...

Similar articles
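The estimator dependence the abstract describes can be illustrated with a minimal, hypothetical sketch (not the paper's method): the same plug-in (histogram) MI estimator, applied to the same independent samples, returns markedly different values depending only on the bin count chosen. All names and parameters below are illustrative assumptions.

```python
import math
import random

def plugin_mi(xs, ys, bins):
    """Plug-in (histogram) estimate of mutual information, in nats."""
    n = len(xs)

    def binned(vals):
        lo, hi = min(vals), max(vals)
        width = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / width), bins - 1) for v in vals]

    bx, by = binned(xs), binned(ys)
    joint, px, py = {}, {}, {}
    for a, b in zip(bx, by):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1

    # MI of the empirical joint distribution: sum p * log(p / (p_x * p_y))
    mi = 0.0
    for (a, b), c in joint.items():
        mi += (c / n) * math.log(c * n / (px[a] * py[b]))
    return mi

random.seed(0)
n = 200
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]  # independent of xs, so true MI = 0

mi_coarse = plugin_mi(xs, ys, bins=4)   # mild upward bias
mi_fine = plugin_mi(xs, ys, bins=32)    # severe upward bias from sparse cells
```

With only 200 samples and true MI of zero, the fine-binned estimate is far larger than the coarse-binned one, which is the kind of prior-dependent discrepancy the paper examines.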


On Classification of Bivariate Distributions Based on Mutual Information

Among all measures of independence between random variables, mutual information is the only one based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we classify some well-known bivariate distributions into two classes of distributions based on their mutua...


Clustering of a Number of Genes Affecting in Milk Production using Information Theory and Mutual Information

Information theory is a branch of mathematics that is used in genetic and bioinformatic analyses and can be applied to many analyses of biological structures and sequences. Bio-computational grouping of genes facilitates genetic analysis, sequencing, and structure-based analyses. In this study, after retrieving gene and exon DNA sequences affecting milk yield in dairy ...


Mutual information is copula entropy

In information theory, mutual information (MI) is a concept distinct from entropy.[1] In this paper, we prove via copulas [2] that they are essentially the same: mutual information is also a kind of entropy, called copula entropy. Based on this result, we propose a simple method for estimating mutual information. Copula theory concerns dependence and the measurement of association.[2] Skla...
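A hedged sketch of the copula-entropy idea (not the authors' implementation): rank-transform each margin to pseudo-observations in (0,1) to obtain the empirical copula, then take the negative plug-in entropy of its binned density as an MI estimate. The bin count and histogram estimator below are illustrative choices.

```python
import math
import random

def copula_mi(xs, ys, bins=8):
    """Estimate MI as the negative entropy of the binned empirical copula.

    Because the rank transform makes both margins uniform, this equals
    a plug-in MI estimate on the binned pseudo-observations.
    """
    n = len(xs)

    def pseudo_obs(vals):
        order = sorted(range(n), key=lambda i: vals[i])
        r = [0.0] * n
        for pos, i in enumerate(order):
            r[i] = (pos + 0.5) / n  # ranks rescaled into (0, 1)
        return r

    u, v = pseudo_obs(xs), pseudo_obs(ys)
    counts = {}
    for a, b in zip(u, v):
        cell = (int(a * bins), int(b * bins))
        counts[cell] = counts.get(cell, 0) + 1

    # Negative copula entropy under a histogram density estimate:
    # -H_c = sum p * log(p * bins^2), since each cell has area 1/bins^2.
    return sum((c / n) * math.log((c / n) * bins * bins)
               for c in counts.values())

random.seed(1)
n = 400
xs = [random.gauss(0, 1) for _ in range(n)]
ys_dep = [x + random.gauss(0, 0.1) for x in xs]        # strongly dependent
ys_ind = [random.gauss(0, 1) for _ in range(n)]        # independent of xs
```

Under these assumptions, `copula_mi(xs, ys_dep)` comes out large while `copula_mi(xs, ys_ind)` stays near zero, reflecting that MI depends only on the copula, not on the margins.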


The kernel mutual information

We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information betwee... Indeed, we demonstrate that the KGV can also be thought of as ...



Journal

Journal title: Bioinformatics

Year: 2010

ISSN: 1367-4803,1460-2059

DOI: 10.1093/bioinformatics/btq497